Nearly Consistent Finite Particle Estimates in Streaming Importance Sampling

Authors

Abstract

In Bayesian inference, we seek to compute information about random variables, such as moments or quantiles, on the basis of available data and prior information. When the distribution of interest is intractable, Monte Carlo (MC) sampling is usually required. Importance sampling is a standard MC tool that approximates this unavailable distribution with a set of weighted samples. This procedure is asymptotically consistent as the number of MC samples (particles) goes to infinity. However, retaining infinitely many particles is intractable. Thus, we propose a way to only keep a finite representative subset of particles and their augmented importance weights that is nearly consistent, i.e., they converge to a close neighborhood of the population ground truth in the large-sample limit. To do so in an online manner, we (1) embed the posterior density estimate in a reproducing kernel Hilbert space (RKHS) through its kernel mean embedding, and (2) sequentially project this RKHS element onto a lower-dimensional subspace using the maximum mean discrepancy, an integral probability metric. Theoretically, we establish that this scheme results in a bias determined by a compression parameter, which yields a tunable tradeoff between consistency and memory. In experiments, we observe that the compressed estimates achieve comparable performance to the dense ones at substantial reductions in representational complexity.
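To make the compression step concrete, the following Python sketch (an illustrative approximation, not the authors' algorithm) maintains a weighted particle set, represents it through its Gaussian-kernel mean embedding, and greedily drops particles so that the maximum mean discrepancy (MMD) between the compressed and the original weighted sets stays small. The budget max_particles, the kernel bandwidth, and the greedy removal rule are assumptions made for illustration; the budget stands in for the compression parameter mentioned in the abstract.

import numpy as np

def gaussian_kernel(x, y, bw=1.0):
    # RBF kernel matrix between particle arrays x (n, d) and y (m, d)
    d2 = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * bw ** 2))

def mmd2(x, wx, y, wy, bw=1.0):
    # squared MMD between two weighted empirical measures (weights sum to 1)
    return (wx @ gaussian_kernel(x, x, bw) @ wx
            - 2.0 * wx @ gaussian_kernel(x, y, bw) @ wy
            + wy @ gaussian_kernel(y, y, bw) @ wy)

def compress(particles, weights, max_particles, bw=1.0):
    # Illustrative sketch, not the paper's algorithm: greedily remove the
    # particle whose removal (after re-normalizing the remaining weights)
    # perturbs the kernel mean embedding the least, until the budget is met.
    x = np.asarray(particles, dtype=float).reshape(len(particles), -1)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    keep = list(range(len(w)))
    while len(keep) > max_particles:
        best_i, best_val = 0, np.inf
        for i in range(len(keep)):
            idx = keep[:i] + keep[i + 1:]
            wk = w[idx] / w[idx].sum()
            val = mmd2(x, w, x[idx], wk, bw)
            if val < best_val:
                best_i, best_val = i, val
        keep.pop(best_i)
    return x[keep], w[keep] / w[keep].sum()

With x, w obtained from self-normalized importance sampling, compress(x, w, 50) returns a 50-particle summary with re-normalized weights whose moment estimates typically stay close to those of the dense set.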


Similar Articles

Importance Sampling Estimates for Policies with Memory

Importance sampling has recently become a popular method for computing off-policy Monte Carlo estimates of returns. It has been known that importance sampling ratios can be computed for POMDPs when the sampled and target policies are both reactive (memoryless). We extend that result to show how they can also be efficiently computed for policies with memory state (finite state controllers) without ...
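For context, the basic computation being extended here is the per-trajectory importance sampling ratio used for off-policy return estimates. The sketch below covers only the reactive (memoryless) case and assumes a simple trajectory format and policy interface chosen for illustration; it is not the finite state controller extension described above.

def off_policy_return(trajectories, behavior_policy, target_policy, gamma=0.99):
    # Ordinary importance sampling estimate of the target policy's expected
    # return from trajectories generated by the behavior policy (sketch).
    # Each trajectory is a list of (observation, action, reward) tuples;
    # each policy maps (observation, action) to a probability.
    total = 0.0
    for traj in trajectories:
        ratio, ret, discount = 1.0, 0.0, 1.0
        for obs, act, rew in traj:
            # per-step likelihood ratio between target and behavior policies
            ratio *= target_policy(obs, act) / behavior_policy(obs, act)
            ret += discount * rew
            discount *= gamma
        total += ratio * ret
    return total / len(trajectories)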


Nearly Optimal Verifiable Data Streaming

The problem of verifiable data streaming (VDS) considers the setting in which a client outsources a large dataset to an untrusted server and the integrity of this dataset is publicly verifiable. A special property of VDS is that the client can append additional elements to the dataset without changing the public verification key. Furthermore, the client may also update elements in the dataset. ...


Local Importance Sampling: A Novel Technique to Enhance Particle Filtering

In the low observation noise limit particle filters become inefficient. In this paper a simple-to-implement particle filter is suggested as a solution to this well-known problem. The proposed Local Importance Sampling based particle filters draw the particles’ positions in a two-step process that makes use of both the dynamics of the system and the most recent observation. Experiments with the ...
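The inefficiency alluded to above is easiest to see in a plain bootstrap particle filter, sketched below for a generic state-space model. The transition and likelihood functions are placeholders, and this is the baseline that observation-aware proposals aim to improve, not the filter proposed in that paper: when observation noise is low, the likelihood is sharply peaked and nearly all propagated particles receive negligible weight.

import numpy as np

def bootstrap_filter_step(particles, weights, y, transition_sample, likelihood, rng):
    # One step of a standard bootstrap particle filter (illustrative baseline).
    # particles: (N, ...) states; weights: (N,) normalized weights; y: observation.
    # transition_sample(x, rng) draws x_t ~ p(x_t | x_{t-1});
    # likelihood(y, x) evaluates p(y | x) elementwise.
    n = len(particles)
    idx = rng.choice(n, size=n, p=weights)                    # resample
    new_particles = transition_sample(particles[idx], rng)    # propagate via dynamics only
    w = likelihood(y, new_particles)                          # reweight by the likelihood
    # with low observation noise this likelihood is sharply peaked, so most
    # weights collapse toward zero and the effective sample size drops
    return new_particles, w / w.sum()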


Group Importance Sampling for Particle Filtering and MCMC

Importance Sampling (IS) is a well-known Monte Carlo technique that approximates integrals involving a posterior distribution by means of weighted samples. In this work, we study the assignation of a single weighted sample which compresses the information contained in a population of weighted samples. Part of the theory that we present as Group Importance Sampling (GIS) has been employed implicitly ...
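A minimal sketch of that bookkeeping idea, summarizing a population of weighted samples with a single weighted sample, is given below. The particular choices made here, a representative drawn by resampling within the group and a group weight equal to the average unnormalized weight, are a common proper-weighting convention adopted for illustration and are not claimed to reproduce the exact GIS estimators.

import numpy as np

def collapse_group(samples, unnorm_weights, rng):
    # Summarize a group of weighted samples by one weighted sample (sketch).
    # samples: (N, d) array; unnorm_weights: (N,) unnormalized importance weights.
    w = np.asarray(unnorm_weights, dtype=float)
    rep = samples[rng.choice(len(w), p=w / w.sum())]   # resampled representative
    group_weight = w.mean()                            # average unnormalized weight
    return rep, group_weight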


Piecewise Constant Sequential Importance Sampling for Fast Particle Filtering

Particle filters are key algorithms for object tracking under non-linear, non-Gaussian dynamics. The high computational cost of particle filters, however, hampers their applicability in cases where the likelihood model is costly to evaluate, or where large numbers of particles are required to represent the posterior. We introduce the piecewise constant sequential importance sampling/resampling ...
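One way to picture a piecewise constant approximation of a costly likelihood, the general device suggested by the title (the paper's actual construction may differ), is to evaluate the likelihood once per bin of a coarse partition of the state space and assign every particle the value of its bin:

import numpy as np

def piecewise_constant_weights(particles, y, likelihood, n_bins=32):
    # Illustrative sketch: replace per-particle likelihood evaluations with a
    # piecewise constant approximation built from n_bins evaluations.
    # particles: (N,) scalar states; likelihood(y, x) evaluates p(y | x) for an array x.
    lo, hi = particles.min(), particles.max()
    edges = np.linspace(lo, hi, n_bins + 1)
    centres = 0.5 * (edges[:-1] + edges[1:])
    lik_per_bin = likelihood(y, centres)               # n_bins evaluations in total
    bins = np.clip(np.digitize(particles, edges) - 1, 0, n_bins - 1)
    w = lik_per_bin[bins]                              # constant within each bin
    return w / w.sum()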



Journal

Journal title: IEEE Transactions on Signal Processing

Year: 2021

ISSN: 1053-587X, 1941-0476

DOI: https://doi.org/10.1109/tsp.2021.3120512